Storkey Learning Rules for Hopfield Networks

Author

  • Xiao Hu
Abstract

We summarize the Storkey Learning Rules for the Hopfield Model and evaluate their performance relative to other learning rules. Hopfield Models are normally used for auto-association, and Storkey Learning Rules have been found to strike a good balance between locality of learning and capacity. In this paper we outline the different learning rules and summarize the capacity results. Hopfield networks are related to Boltzmann Machines: they are equivalent to fully visible Boltzmann Machines in the zero-temperature limit. Perhaps renewed interest in Boltzmann Machines will produce renewed interest in Hopfield learning rules?
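For concreteness, below is a minimal NumPy sketch of the incremental Storkey (1997) rule in its usual statement, Δw_ij = (ξ_i ξ_j − ξ_i h_ji − h_ij ξ_j)/n, where h_ij is the local field at unit i excluding the contributions of units i and j. The function name and the zero-diagonal convention are our choices, not taken from the paper.

```python
import numpy as np

def storkey_train(patterns: np.ndarray) -> np.ndarray:
    """Incremental Storkey rule for a Hopfield network.

    patterns: (p, n) array of +/-1 memories. Returns an (n, n) weight matrix.
    """
    p, n = patterns.shape
    w = np.zeros((n, n))
    for xi in patterns:
        # Local field h_ij = sum_{k != i, j} w_ik * xi_k, built from the
        # full field by removing the self (k = i) and partner (k = j) terms.
        full = w @ xi                                   # sum_k w_ik xi_k
        h = (full - np.diag(w) * xi)[:, None] - w * xi[None, :]
        # Storkey update: dw_ij = (xi_i xi_j - xi_i h_ji - h_ij xi_j) / n
        w = w + (np.outer(xi, xi) - xi[:, None] * h.T - h * xi[None, :]) / n
    np.fill_diagonal(w, 0.0)                            # common convention
    return w
```

The update remains local in the sense the abstract alludes to: each weight change uses only the pattern bits and local fields at its two endpoint units.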


Similar References

Increasing the Capacity of a Hopfield Network without Sacrificing Functionality

Hopfield networks are commonly trained by one of two algorithms. The simplest of these is the Hebb rule, which has a low absolute capacity of n/(2 ln n), where n is the total number of neurons. This capacity can be increased to n by using the pseudo-inverse rule. However, capacity is not the only consideration. It is important for rules to be local (the weight of a synapse depends only on informa...
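The two rules contrasted here are easy to state concretely. A minimal NumPy sketch follows, assuming bipolar (±1) patterns stored as rows; function names are ours. The Hebb rule is local and one-shot, while the pseudo-inverse (projection) rule needs all patterns at once, which is part of what it trades for the higher capacity.

```python
import numpy as np

def hebb_train(patterns: np.ndarray) -> np.ndarray:
    """One-shot Hebbian rule: w = (1/n) * sum_mu xi^mu (xi^mu)^T.
    Local, but absolute capacity scales only as n / (2 ln n)."""
    p, n = patterns.shape
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def pseudo_inverse_train(patterns: np.ndarray) -> np.ndarray:
    """Pseudo-inverse (projection) rule: project onto the pattern subspace,
    making each stored pattern a fixed point. Non-local, since the update
    for one weight depends on all patterns, but capacity rises to n.
    (Zeroing the diagonal is a common convention that slightly perturbs
    the exact projection.)"""
    X = patterns.astype(float)
    w = X.T @ np.linalg.pinv(X @ X.T) @ X
    np.fill_diagonal(w, 0.0)
    return w
```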


Stochastic Dynamics and High Capacity Associative Memories

The addition of noise to the deterministic Hopfield network, trained with one-shot Hebbian learning, is known to bring benefits in the elimination of spurious attractors. This paper extends the analysis to learning rules that have a much higher capacity. The relative energy of desired and spurious attractors is reported, and the effect of adding noise to the dynamics is empirically investigated....
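The standard way to add such noise is to replace the deterministic sign update with stochastic Glauber dynamics at inverse temperature β; β → ∞ recovers the deterministic network. A self-contained sketch, with the demo network and parameter values chosen purely for illustration:

```python
import numpy as np

def glauber_step(w, s, beta, rng):
    """One asynchronous Glauber update: pick a random unit and set it to +1
    with probability 1 / (1 + exp(-2 * beta * h)), where h is its local field.
    As beta grows this approaches the deterministic Hopfield update."""
    i = rng.integers(len(s))
    h = w[i] @ s
    p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
    s[i] = 1 if rng.random() < p_plus else -1
    return s

# Usage: relax a corrupted probe toward a single Hebbian memory.
rng = np.random.default_rng(0)
n = 100
xi = rng.choice([-1, 1], size=n)
w = np.outer(xi, xi) / n            # one stored pattern, for demonstration
np.fill_diagonal(w, 0.0)
s = xi.copy()
s[:20] *= -1                        # corrupt 20 of the 100 bits
for _ in range(20 * n):
    glauber_step(w, s, beta=4.0, rng=rng)
print("overlap with stored pattern:", (s @ xi) / n)
```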


High Performance Associative Memory Models and Symmetric Connections

Two existing high-capacity training rules for the standard Hopfield architecture associative memory are examined. Both rules, based on the perceptron learning rule, produce asymmetric weight matrices, for which the simple dynamics (only point attractors) of a symmetric network can no longer be guaranteed. This paper examines the consequences of imposing a symmetry constraint in learning. The mea...
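One simple way to impose such a symmetry constraint, sketched below, is to mirror every perceptron-style update onto both w_ij and w_ji whenever a unit fails a stability margin on a stored pattern. This is our illustrative symmetrization, not necessarily the exact scheme examined in the paper; the margin, learning rate, and epoch cap are arbitrary.

```python
import numpy as np

def symmetric_perceptron_train(patterns, margin=1.0, lr=0.1, epochs=100):
    """Perceptron-style training with an explicit symmetry constraint:
    if unit i violates xi_i * h_i >= margin on a pattern, add
    lr * xi_i * xi_j / n to BOTH w_ij and w_ji, keeping w symmetric."""
    p, n = patterns.shape
    w = np.zeros((n, n))
    for _ in range(epochs):
        stable = True
        for xi in patterns:
            h = w @ xi
            bad = xi * h < margin          # units violating the margin
            if bad.any():
                stable = False
                for i in np.where(bad)[0]:
                    upd = lr * xi[i] * xi / n
                    w[i, :] += upd
                    w[:, i] += upd         # mirror update preserves symmetry
        np.fill_diagonal(w, 0.0)
        if stable:
            break
    return w
```

A symmetric weight matrix guarantees an energy function, and hence point-attractor dynamics, which is exactly the property the asymmetric perceptron-trained networks give up.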


Effectiveness of Neural Network Learning Rules Generated by a Biophysical Model of Synaptic Plasticity

We describe our initial attempts to reconcile powerful neural network learning rules derived from computational principles with learning rules derived “bottom-up” from biophysical mechanisms. Using a biophysical model of synaptic plasticity (Shouval, Bear, and Cooper, 2002), we generated numerical synaptic learning rules and compared them to the performance of a Hebbian learning rule in a previ...


High Performance Associative Memory and Sign Constraints

The consequences of imposing a sign constraint on the standard Hopfield architecture associative memory model, trained using perceptron-like learning rules, are examined. Such learning rules have been shown to have a capacity of at most half that of their unconstrained versions. This paper reports experimental investigations into the consequences of constraining the sign of the network weights in terms...
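A common way to enforce a sign constraint of this kind, sketched below under the assumption of a fixed prescribed sign matrix (a hypothetical constraint of our choosing, e.g. in the spirit of Dale's law), is to clip any weight whose sign disagrees with the constraint to zero after each learning update; the paper's exact enforcement may differ.

```python
import numpy as np

def enforce_sign(w: np.ndarray, sign_mask: np.ndarray) -> np.ndarray:
    """Clip weights whose sign violates the constraint to zero.

    sign_mask: (n, n) array of +1/-1 giving each weight's allowed sign.
    Zero weights are left untouched (np.sign(0) == 0 never matches -1/+1).
    """
    w = w.copy()
    w[np.sign(w) == -sign_mask] = 0.0
    return w

# Usage: call after every weight update during training, e.g.
#   w = enforce_sign(w + dw, sign_mask)
```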



Journal:

Volume   Issue

Pages  -

Publication date: 2013